Guess-Averse Loss Functions For Cost-Sensitive Multiclass Boosting

Authors

  • Oscar Beijbom
  • Mohammad J. Saberian
  • David J. Kriegman
  • Nuno Vasconcelos
Abstract

Cost-sensitive multiclass classification has recently acquired significance in several applications, through the introduction of multiclass datasets with well-defined misclassification costs. The design of classification algorithms for this setting is considered. It is argued that the unreliable performance of current algorithms is due to the inability of the underlying loss functions to enforce a certain fundamental underlying property. This property, denoted guess-aversion, is that the loss should encourage correct classifications over the arbitrary guessing that ensues when all classes are equally scored by the classifier. While guess-aversion holds trivially for binary classification, this is not true in the multiclass setting. A new family of cost-sensitive guess-averse loss functions is derived, and used to design new cost-sensitive multiclass boosting algorithms, denoted GEL- and GLL-MCBoost. Extensive experiments demonstrate (1) the importance of guess-aversion and (2) that the GLL loss function outperforms other loss functions for multiclass boosting.
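The guess-aversion property described above can be illustrated numerically. The sketch below uses a simplified cost-weighted pairwise exponential loss (an assumed stand-in for illustration, not necessarily the paper's exact GEL/GLL formulation) and checks that scoring the true class highest incurs strictly lower loss than the all-tied "guessing" configuration:

```python
import math

def pairwise_exp_loss(y, f, C):
    """Cost-weighted pairwise exponential loss (illustrative):
    sum over wrong classes j of C[y][j] * exp(f[j] - f[y])."""
    return sum(C[y][j] * math.exp(f[j] - f[y])
               for j in range(len(f)) if j != y)

# Example cost matrix for 3 classes (zero diagonal, asymmetric costs).
C = [[0, 1, 2],
     [1, 0, 1],
     [2, 1, 0]]

y = 0  # true class
guess   = pairwise_exp_loss(y, [0.0, 0.0, 0.0], C)  # all classes tied
correct = pairwise_exp_loss(y, [1.0, 0.0, 0.0], C)  # true class scored highest

# Guess-aversion: confident correct scoring beats arbitrary guessing.
assert correct < guess
```

Here `guess` evaluates to 3.0 while `correct` evaluates to 3/e ≈ 1.10, so the loss indeed penalizes the equal-score configuration more than a correct classification.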


Similar Articles

Totally Corrective Multiclass Boosting with Binary Weak Learners

In this work, we propose a new optimization framework for multiclass boosting learning. In the literature, AdaBoost.MO and AdaBoost.ECC are the two successful multiclass boosting algorithms, which can use binary weak learners. We explicitly derive these two algorithms’ Lagrange dual problems based on their regularized loss functions. We show that the Lagrange dual formulations enable us to desi...


Cost-sensitive Boosting with p-norm Loss Functions and its Applications

In practical applications of classification, there are often varying costs associated with different types of misclassification (e.g. fraud detection, anomaly detection and medical diagnosis), motivating the need for the so-called ”cost-sensitive” classification. In this paper, we introduce a family of novel boosting methods for cost-sensitive classification by applying the theory of gradient b...


Multi-Resolution Cascades for Multiclass Object Detection

An algorithm for learning fast multiclass object detection cascades is introduced. It produces multi-resolution (MRes) cascades, whose early stages are binary target vs. non-target detectors that eliminate false positives, whose late stages are multiclass classifiers that finely discriminate target classes, and whose middle stages have intermediate numbers of classes, determined in a data-driven manner. This M...


Transforming examples for multiclass boosting

AdaBoost.M2 and AdaBoost.MH are boosting algorithms for learning from multiclass datasets. They have received less attention than other boosting algorithms because they require base classifiers that can handle the pseudoloss or Hamming loss, respectively. The difficulty with these loss functions is that each example is associated with k weights, where k is the number of classes. We address this...


Sensitive Error Correcting Output Codes

We present a reduction from cost-sensitive classification to binary classification based on (a modification of) error correcting output codes. The reduction satisfies the property that regret for binary classification implies l2-regret of at most twice that for cost-estimation. This has several implications: 1) Any regret-minimizing online algorithm for 0/1 loss is (via the reduction) a regret-minimizing onl...




Journal title: —

Volume/Issue: —

Pages: —

Publication date: 2014